• The Italian prime minister is seeking over $100,000 in damages for deepfake porn videos of her.
  • Giorgia Meloni's lawyer says they hope it will encourage other female victims to press charges.
  • But many victims of deepfake porn never get justice, and the US has no federal law against it.

Italy's prime minister, Giorgia Meloni, is seeking more than $100,000 in damages after deepfake porn videos of her were uploaded online.

According to several media outlets, police believe a 40-year-old man created the deepfake videos of Meloni, with his 73-year-old father also under investigation.

Deepfake videos use a form of AI called deep learning to either create completely computer-generated footage or superimpose a likeness onto existing videos.
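To make that mechanism concrete, here is a minimal, hypothetical PyTorch sketch of the classic face-swap design used by many early deepfake tools: one shared encoder paired with one decoder per identity, so a frame of one person can be decoded into another person's likeness. The layer sizes, class names, and 64x64 input are illustrative assumptions, not code from any real tool.

```python
# Illustrative sketch only: shared encoder + per-identity decoders,
# the basic "face-swap" autoencoder idea behind many deepfake tools.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),    # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),   # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs one person's face from the shared latent space."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # would be trained only on person A's face crops
decoder_b = Decoder()  # would be trained only on person B's face crops

# After training, the "swap": encode a frame of person A, then decode it with
# B's decoder, yielding B's likeness with A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)  # stand-in for a real video frame
swapped = decoder_b(encoder(frame_of_a))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```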

The majority of these videos are pornographic and nonconsensual, and they primarily target women, according to a 2019 analysis of thousands of deepfake videos by Deeptrace Labs.

Taylor Swift is among the recent high-profile victims of such videos.

These videos also pose a growing threat as a tool for political misinformation.

According to The Independent, detectives working on the Meloni case were able to trace the cellphone used to upload the deepfake videos.

The videos surfaced on a US porn website in 2020, before Meloni became prime minister in 2022, and amassed millions of views over several months, MailOnline reported.

In her civil suit, Meloni is seeking a "symbolic" sum of €100,000 in damages, about $105,000 in US currency, her lawyer Maria Giulia Marongiu told BBC News.

Marongiu, who did not immediately respond to a request for comment from BI, said that if the legal effort succeeds, the money will be donated to female victims of male violence.

She also told BBC News that it would "send a message to women who are victims of this kind of abuse of power not to be afraid to press charges."

The Independent reported that the pair under investigation are accused of defamation, which can result in prison time under Italian law.

Meloni will testify in court on July 2, according to reports.

The case is a rare example of someone pursuing justice over deepfake videos.

In the US, it's often not fear that prevents women from seeking recourse but rather the limited options available.

According to the AP, at least 10 states have enacted laws against deepfakes, but there are currently no federal laws criminalizing the creation or sharing of deepfake porn.

In January, a bipartisan group of senators introduced a federal bill known as the DEFIANCE Act in an effort to remedy this. The proposed legislation would allow victims to sue those who create and disseminate sexually explicit deepfakes of them.

Last month, hundreds of US academics, politicians, and AI leaders also signed an open letter raising concerns over nonconsensual or grossly misleading AI-generated content that a reasonable person would mistake for real.

It described deepfakes as "a growing threat to society" and called on governments to take action to stop them.

In the meantime, thousands of women have found a workaround, according to an investigation by WIRED: filing copyright claims against websites that share deepfake videos of them.
